Abstract
In Turkish higher education, three examinations are typically defined for a course: the midterm, final, and make-up exams. Whether a student passes a course is decided from the midterm and final exam scores. A student who fails the course as a result of these exams may take a make-up exam, whose score then replaces the final exam score. However, some students who are expected to take the make-up exam do not, for reasons such as their average score, distance, a low midterm score, etc. Because make-up exam planning and scheduling are based on the number of students who failed the course, resources such as classrooms, invigilators, exam papers, and toner are wasted. To reduce this waste, this study applies artificial neural networks (ANNs), trained by different approaches, to predict the number of students taking make-up examinations. In the proposed framework, features of students and courses are determined, the data is collected, and ANNs are trained on these datasets. Each student who fails the course is then classified by the trained ANNs as positive (taking the make-up exam) or negative (not taking it). In the experiments, data from ten different courses are used to train ANNs with a random weight network (RWN), the error back-propagation (BP) algorithm, and several metaheuristic algorithms such as the grey wolf optimizer (GWO), artificial bee colony, particle swarm optimization, and ant colony optimization. The trained ANNs are compared in terms of training accuracy, testing accuracy, and training time. BP achieves the best mean training accuracy on the unnormalized and normalized datasets, with 99.36% and 99.7%, respectively.
GWO achieves the best mean testing accuracy on the unnormalized and normalized datasets, with 80.39% and 82.39%, respectively. Moreover, RWN has the best running time, training the ANN in under a second on both datasets. The experiments and comparisons show that an ANN-based classifier can be used to determine the number of students taking the make-up exam.
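The metaheuristic training the abstract describes treats the ANN's weights as a flat vector and searches that space directly, rather than following gradients. The sketch below is not the paper's implementation: it uses a simplified leader-following stochastic search as a stand-in for GWO/ABC/PSO, a single-hidden-layer network, and hypothetical two-feature toy data (e.g. a normalized midterm score and course average), purely to illustrate the positive/negative make-up-exam classification setup.

```python
import numpy as np

rng = np.random.default_rng(42)

def forward(w, X, n_hidden):
    """Single-hidden-layer MLP: unpack a flat weight vector, return P(positive)."""
    n_in = X.shape[1]
    i = n_in * n_hidden
    w1, b1 = w[:i].reshape(n_in, n_hidden), w[i : i + n_hidden]
    w2, b2 = w[i + n_hidden : -1].reshape(n_hidden, 1), w[-1]
    h = np.tanh(X @ w1 + b1)                               # hidden activations
    return 1.0 / (1.0 + np.exp(-(h @ w2 + b2).ravel()))    # sigmoid output

def mse(w, X, y, n_hidden):
    """Fitness of one candidate weight vector: mean squared error on labels."""
    return np.mean((forward(w, X, n_hidden) - y) ** 2)

def train(X, y, n_hidden=5, pop=30, iters=300):
    """Leader-following stochastic search over the weight vector
    (a simplified stand-in for GWO/ABC/PSO-style ANN training)."""
    dim = X.shape[1] * n_hidden + n_hidden + n_hidden + 1
    swarm = rng.normal(0.0, 1.0, (pop, dim))
    best = min(swarm, key=lambda w: mse(w, X, y, n_hidden)).copy()
    for _ in range(iters):
        # pull every candidate toward the current leader, plus exploration noise
        swarm = (best
                 + rng.uniform(0.0, 1.0, (pop, 1)) * (swarm - best)
                 + rng.normal(0.0, 0.05, (pop, dim)))
        cand = min(swarm, key=lambda w: mse(w, X, y, n_hidden))
        if mse(cand, X, y, n_hidden) < mse(best, X, y, n_hidden):
            best = cand.copy()
    return best

# Toy data: two hypothetical per-student features; label 1 = takes the make-up exam.
X = rng.uniform(0.0, 1.0, (60, 2))
y = (X[:, 0] + X[:, 1] > 1.0).astype(float)

w = train(X, y)
pred = (forward(w, X, 5) >= 0.5).astype(float)  # positive/negative decision
acc = (pred == y).mean()                        # training accuracy on the toy set
```

RWN-style training would instead fix the random hidden-layer weights and solve only the output layer in closed form, which is why its training time is far lower; BP would replace the search loop with gradient descent on the same MSE objective.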
Acknowledgements
The authors would like to thank Konya Technical University for providing datasets.
Cite this article
Kiran, M.S., Siramkaya, E., Esme, E. et al. Prediction of the number of students taking make-up examinations using artificial neural networks. Int. J. Mach. Learn. & Cyber. 13, 71–81 (2022). https://doi.org/10.1007/s13042-021-01348-y